111 research outputs found

    Effective Edge-Fault-Tolerant Single-Source Spanners via Best (or Good) Swap Edges

    Full text link
    Computing \emph{all best swap edges} (ABSE) of a spanning tree $T$ of a given $n$-vertex and $m$-edge undirected and weighted graph $G$ means to select, for each edge $e$ of $T$, a corresponding non-tree edge $f$, in such a way that the tree obtained by replacing $e$ with $f$ enjoys some optimality criterion (which is naturally defined according to some objective function originally addressed by $T$). Efficiently solving an ABSE problem is by now a classic algorithmic issue, since it conveys a very successful way of coping with a (transient) \emph{edge failure} in tree-based communication networks: just replace the failing edge with its respective swap edge, so that connectivity is promptly reestablished while minimizing the rerouting and set-up costs. In this paper, we solve the ABSE problem for the case in which $T$ is a \emph{single-source shortest-path tree} of $G$, and our two selected swap criteria aim to minimize either the \emph{maximum} or the \emph{average stretch} in the swap tree of all the paths emanating from the source. With these criteria in mind, the obtained structures can then be viewed as \emph{edge-fault-tolerant single-source spanners}. For them, we propose two efficient algorithms running in $O(mn + n^2 \log n)$ and $O(mn \log \alpha(m,n))$ time, respectively, and we show that the guaranteed (either maximum or average, respectively) stretch factor is equal to 3, and this is tight. Moreover, for the maximum stretch, we also propose an almost linear $O(m \log \alpha(m,n))$ time algorithm computing a set of \emph{good} swap edges, each of which will guarantee a relative approximation factor on the maximum stretch of $3/2$ (tight) as opposed to that provided by the corresponding BSE. Surprisingly, no previous results were known for these two very natural swap problems. Comment: 15 pages, 4 figures, SIROCCO 201
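
    To make the swap-edge notion concrete, the following is a minimal brute-force sketch, not the paper's algorithm: it assumes a networkx-style weighted graph, a precomputed shortest-path tree, and illustrative names such as best_swap_edge. For a failing tree edge $e$ it tries every non-tree edge crossing the cut induced by $e$ and keeps the one minimizing the maximum stretch of the source-to-vertex distances with respect to the post-failure graph.

```python
import networkx as nx

def best_swap_edge(G, T, source, e):
    """Brute-force sketch: for tree edge e, return a non-tree swap edge f
    minimizing the maximum stretch, over all vertices, of the source-to-vertex
    distance in the swap tree T - e + f w.r.t. the distance in G - e."""
    G_fail = G.copy()
    G_fail.remove_edge(*e)
    dist_fail = nx.single_source_dijkstra_path_length(G_fail, source, weight="weight")
    T_minus = T.copy()
    T_minus.remove_edge(*e)
    source_side = nx.node_connected_component(T_minus, source)  # side of the cut containing the source
    best_f, best_stretch = None, float("inf")
    for u, v, w in G.edges(data="weight"):
        if T.has_edge(u, v) or (u in source_side) == (v in source_side):
            continue  # f must be a non-tree edge crossing the cut induced by e
        swap_tree = T_minus.copy()
        swap_tree.add_edge(u, v, weight=w)
        dist_swap = nx.single_source_dijkstra_path_length(swap_tree, source, weight="weight")
        stretch = max(dist_swap[x] / dist_fail[x]
                      for x in dist_fail if x != source and dist_fail[x] > 0)
        if stretch < best_stretch:
            best_f, best_stretch = (u, v), stretch
    return best_f, best_stretch
```

    Repeating this for every tree edge gives the naive ABSE baseline that dedicated algorithms, such as those in the paper, are designed to improve upon.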

    Reoptimization of Some Maximum Weight Induced Hereditary Subgraph Problems

    Get PDF
    The reoptimization issue studied in this paper can be described as follows: given an instance I of some problem Π, an optimal solution OPT for Π in I, and an instance I′ resulting from a local perturbation of I that consists of insertions or removals of a small number of data items, we wish to use OPT in order to solve Π in I′, either optimally or by guaranteeing an approximation ratio better than that guaranteed by an ex nihilo computation, and with a running time better than that needed for such a computation. We use this setting in order to study weighted versions of several representatives of a broad class of problems known in the literature as maximum induced hereditary subgraph problems. The main problems studied are max independent set, max k-colorable subgraph and max split subgraph under vertex insertions and deletions
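
    As a toy illustration of this setting (not one of the paper's algorithms), consider max weight independent set when a single vertex is inserted: reusing the old optimum and comparing it with a solution built around the new vertex already yields a 1/2-approximation, since the new optimum can exceed the old one by at most the weight of the inserted vertex. A hedged sketch with illustrative names:

```python
def reopt_mwis_insert(old_opt, weights, new_vertex, new_neighbors):
    """Sketch of a classic reoptimization trick for max weight independent
    set under insertion of one vertex: reuse the old optimum instead of
    solving the new instance from scratch.

    old_opt       : optimal independent set of the old instance (set of vertices)
    weights       : dict vertex -> non-negative weight (including new_vertex)
    new_neighbors : vertices of the old instance adjacent to new_vertex

    Returns an independent set of the new instance of weight at least half the
    new optimum: if the new optimum avoids new_vertex it is matched by old_opt,
    otherwise it is at most w(old_opt) + w(new_vertex)."""
    keep_old = set(old_opt)                                        # still independent in the new instance
    take_new = {new_vertex} | (set(old_opt) - set(new_neighbors))  # independent by construction
    def w(s):
        return sum(weights[x] for x in s)
    return keep_old if w(keep_old) >= w(take_new) else take_new
```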

    Path-Fault-Tolerant Approximate Shortest-Path Trees

    Full text link
    Let $G=(V,E)$ be an $n$-node non-negatively real-weighted undirected graph. In this paper we show how to enrich a {\em single-source shortest-path tree} (SPT) of $G$ with a \emph{sparse} set of \emph{auxiliary} edges selected from $E$, in order to create a structure which effectively tolerates a \emph{path failure} in the SPT. This consists of a simultaneous fault of a set $F$ of at most $f$ adjacent edges along a shortest path emanating from the source, and it is recognized as one of the most frequent disruptions in an SPT. We show that, for any integer parameter $k \geq 1$, it is possible to provide a very sparse (i.e., of size $O(kn \cdot f^{1+1/k})$) auxiliary structure that carefully approximates (i.e., within a stretch factor of $(2k-1)(2|F|+1)$) the true shortest paths from the source during the lifetime of the failure. Moreover, we show that our construction can be further refined to get a stretch factor of 3 and a size of $O(n \log n)$ for the special case $f=2$, and that it can be converted into a very efficient \emph{approximate-distance sensitivity oracle}, which allows one to quickly (even in optimal time, if $k=1$) reconstruct the shortest paths (w.r.t. our structure) from the source after a path failure, thus permitting the needed rerouting operations to be performed promptly. Our structure compares favorably with previously known solutions, as we discuss in the paper, and moreover it is also very effective in practice, as we assess through a large set of experiments. Comment: 21 pages, 3 figures, SIROCCO 201
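
    For intuition, the stretch guarantee of such a structure can be verified directly on small instances. The sketch below is illustrative only: it assumes networkx, a weighted graph G, and a structure H consisting of the SPT plus the auxiliary edges, and it reports the worst ratio between post-failure distances in H and the true post-failure distances in G.

```python
import networkx as nx

def max_stretch_after_failure(G, H, source, failed_edges):
    """Illustrative check of the stretch of a fault-tolerant structure H
    (the SPT plus auxiliary edges, a subgraph of G) after the simultaneous
    failure of the edges in failed_edges."""
    G_fail, H_fail = G.copy(), H.copy()
    G_fail.remove_edges_from(failed_edges)
    H_fail.remove_edges_from(failed_edges)
    d_true = nx.single_source_dijkstra_path_length(G_fail, source, weight="weight")
    d_struct = nx.single_source_dijkstra_path_length(H_fail, source, weight="weight")
    # worst ratio of the surviving structure's distance to the true distance
    return max(d_struct.get(v, float("inf")) / d_true[v]
               for v in d_true if v != source and d_true[v] > 0)
```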

    Network Creation Games: Think Global - Act Local

    Full text link
    We investigate a non-cooperative game-theoretic model for the formation of communication networks by selfish agents. Each agent aims for a central position at minimum cost for creating edges. In particular, the general model (Fabrikant et al., PODC'03) became popular for studying the structure of the Internet or social networks. Despite its significance, locality in this game was first studied only recently (Bilò et al., SPAA'14), where a worst-case locality model was presented, which came with a high efficiency loss in terms of the quality of equilibria. Our main contribution is a new and more optimistic view on locality: agents are limited in their knowledge and actions to their local view ranges, but can probe different strategies and finally choose the best one. We study the influence of our locality notion on the hardness of computing best responses, convergence to equilibria, and quality of equilibria. Moreover, we compare the strength of local versus non-local strategy changes. Our results address the gap between the original model and the worst-case locality variant. On the bright side, our efficiency results are in line with observations from the original model, yet we have a non-constant lower bound on the price of anarchy. Comment: An extended abstract of this paper has been accepted for publication in the proceedings of the 40th International Conference on Mathematical Foundations of Computer Science
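
    For readers unfamiliar with the underlying model, an agent in the game of Fabrikant et al. pays α for every edge it buys plus the sum of its distances to all other nodes, and the price of anarchy compares the worst equilibrium network to the social optimum. A minimal sketch under these assumptions (unweighted distances, networkx, illustrative names; the local variant additionally restricts which deviations an agent may consider):

```python
import networkx as nx

def agent_cost(G, bought, agent, alpha):
    """Cost of one agent in the network creation game of Fabrikant et al.:
    alpha per bought edge plus the sum of hop distances to all other nodes
    (infinite if the network is disconnected).

    G      : undirected graph formed by all bought edges
    bought : dict agent -> list of edges this agent pays for
    """
    dist = nx.single_source_shortest_path_length(G, agent)
    if len(dist) < G.number_of_nodes():
        return float("inf")
    return alpha * len(bought[agent]) + sum(d for v, d in dist.items() if v != agent)

def social_cost(G, bought, alpha):
    """Sum of all agents' costs; the price of anarchy is the worst ratio of
    an equilibrium's social cost to the optimal social cost."""
    return sum(agent_cost(G, bought, a, alpha) for a in G.nodes)
```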

    Augmenting graphs to minimize the diameter

    Full text link
    We study the problem of augmenting a weighted graph by inserting edges of bounded total cost while minimizing the diameter of the augmented graph. Our main result is an FPT 4-approximation algorithm for the problem. Comment: 15 pages, 3 figures
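
    On small instances the problem can be pinned down (and solved exactly) by exhaustive search, which also fixes the objective: choose a set of insertable edges of total cost at most the budget minimizing the weighted diameter of the augmented graph. A hedged sketch assuming networkx, a connected input graph, and candidate edges annotated with both a weight and a cost (names are illustrative):

```python
import itertools
import networkx as nx

def augment_min_diameter(G, candidates, budget):
    """Exhaustive sketch of the augmentation problem: among all subsets of
    candidate edges (u, v, {"weight": w, "cost": c}) of total cost at most
    budget, return one minimizing the weighted diameter of G plus the chosen
    edges.  Exponential in len(candidates); only meant to fix the objective."""
    best_set, best_diam = (), float("inf")
    for r in range(len(candidates) + 1):
        for subset in itertools.combinations(candidates, r):
            if sum(attrs["cost"] for _, _, attrs in subset) > budget:
                continue
            H = G.copy()
            H.add_edges_from(subset)
            lengths = dict(nx.all_pairs_dijkstra_path_length(H, weight="weight"))
            diam = max(max(d.values()) for d in lengths.values())
            if diam < best_diam:
                best_set, best_diam = subset, diam
    return best_set, best_diam
```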

    Selfish Network Creation with Non-Uniform Edge Cost

    Full text link
    Network creation games investigate complex networks from a game-theoretic point of view. Based on the original model by Fabrikant et al. [PODC'03], many variants have been introduced. However, almost all versions have the drawback that edges are treated uniformly, i.e. every edge has the same cost, and this common parameter heavily influences the outcomes and the analysis of these games. We propose and analyze simple and natural parameter-free network creation games with non-uniform edge cost. Our models are inspired by social networks, where the cost of forming a link is proportional to the popularity of the targeted node. Besides results on the complexity of computing a best response and on various properties of the sequential versions, we show that the most general version of our model has a constant Price of Anarchy. To the best of our knowledge, this is the first proof of a constant Price of Anarchy for any network creation game. Comment: To appear at SAGT'1
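
    As a rough illustration of the non-uniform pricing idea (the paper's exact price function may differ), the uniform α in the classical cost function can be replaced by a price that grows with the popularity, e.g. the degree, of the node an edge points to. A hedged adaptation of the agent_cost sketch given above:

```python
import networkx as nx

def agent_cost_degree_priced(G, bought, agent):
    """Hedged variant of the classical agent cost: an edge bought towards
    node v is priced by v's degree (a proxy for its popularity) instead of
    a uniform alpha; the distance term is unchanged."""
    dist = nx.single_source_shortest_path_length(G, agent)
    if len(dist) < G.number_of_nodes():
        return float("inf")                       # disconnected networks have infinite cost
    # bought[agent] holds (agent, v) pairs, so the second entry is the targeted node
    edge_price = sum(G.degree(v) for _, v in bought[agent])
    return edge_price + sum(d for v, d in dist.items() if v != agent)
```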

    The sequential price of anarchy for atomic congestion games

    Get PDF
    In situations without central coordination, the price of anarchy relates the quality of any Nash equilibrium to the quality of a global optimum. Instead of assuming that all players choose their actions simultaneously, here we consider games where players choose their actions sequentially. The sequential price of anarchy, recently introduced by Paes Leme, Syrgkanis, and Tardos, then relates the quality of any subgame perfect equilibrium to the quality of a global optimum. The effect of sequential decision making on the quality of equilibria, however, depends on the specific game under consideration. Here we analyze the sequential price of anarchy for atomic congestion games with affine cost functions. We derive several lower and upper bounds, showing that sequential decisions mitigate the worst-case outcomes known for the classical price of anarchy. Next to tight bounds on the sequential price of anarchy, a methodological contribution of our work is, among other things, a "factor-revealing" integer linear programming approach that we use to solve the case of three players
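
    To make the sequential notion concrete: players move one after another, and a subgame perfect outcome is obtained by backward induction, each player anticipating the optimal reactions of later movers. The sketch below (illustrative only, not the paper's factor-revealing ILP) computes such an outcome for a small atomic congestion game in which a resource used by x players costs a*x + b to each of them:

```python
def spe_outcome(strategies, affine):
    """Backward-induction sketch for a sequential atomic congestion game with
    affine resource costs.  strategies[i] lists the strategies (tuples of
    resources) of player i, who moves i-th; affine[r] = (a, b) means that a
    resource r used by x players costs a*x + b to each of them.

    Returns a subgame perfect outcome, one strategy per player."""
    n = len(strategies)

    def cost(profile, i):
        load = {}
        for s in profile:
            for r in s:
                load[r] = load.get(r, 0) + 1
        return sum(affine[r][0] * load[r] + affine[r][1] for r in profile[i])

    def best(prefix):
        # players len(prefix), ..., n-1 still have to move and react optimally
        i = len(prefix)
        if i == n:
            return prefix
        best_outcome, best_cost = None, float("inf")
        for s in strategies[i]:
            outcome = best(prefix + (s,))
            c = cost(outcome, i)
            if c < best_cost:
                best_outcome, best_cost = outcome, c
        return best_outcome

    return best(())
```

    For instance, with two players and two parallel links of per-user cost x, spe_outcome([[("l1",), ("l2",)], [("l1",), ("l2",)]], {"l1": (1, 0), "l2": (1, 0)}) returns a profile in which the two players pick different links.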

    Effects of Different Up-Dosing Regimens for Hymenoptera Venom Immunotherapy on Serum CTLA-4 and IL-10

    Get PDF
    BACKGROUND: Cytotoxic T lymphocyte associated antigen-4 (CTLA-4) is involved in the activation pathways of T lymphocytes. It has been shown that the circulating form of CTLA-4 is elevated in patients with hymenoptera allergy and can be down-regulated by immunotherapy. OBJECTIVE: To assess the effects on CTLA-4 of venom immunotherapy (VIT), given with different induction protocols: conventional (6 weeks), rush (3 days) or ultra-rush (1 day). METHODS: Sera from patients with hymenoptera allergy were collected at baseline and at the end of the induction phase. CTLA-4 and IL-10 were assayed in the same samples. A subset of patients was also assayed after 12 months of VIT maintenance. RESULTS: Ninety-four patients were studied. Of them, 50 underwent the conventional induction, 20 the rush and 24 the ultra-rush. Soluble CTLA-4 (sCTLA-4) was detectable in all patients at baseline and significantly decreased at the end of the induction, irrespective of its duration. Of note, a significant decrease of sCTLA-4 could be seen already at 24 hours. In parallel, IL-10 significantly increased at the end of the induction. At 12 months, sCTLA-4 remained low, whereas IL-10 returned to the baseline values. CONCLUSIONS: Serum CTLA-4 is an early marker of the immunological effects of venom immunotherapy, and its changes persist after one year of maintenance treatment

    Boolean Game with Prioritized Norms

    Get PDF
    In this paper we study boolean games with prioritized norms. Norms distinguish illegal strategies from legal ones. Notions such as legal strategy and legal Nash equilibrium are introduced. Our formal model is a combination of (weighted) boolean games and so-called (prioritized) input/output logic. After formally presenting the model, we use examples to show that non-optimal Nash equilibria can be avoided by making use of norms. We also study various complexity issues related to legal strategies and legal Nash equilibria
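
    The central notions can be made concrete on toy instances by plain enumeration. The sketch below is only illustrative (two-valued utilities, no weights or priorities, and names that are not the paper's): norms are represented by the set of legal strategies of each player, and a legal Nash equilibrium is a profile of legal strategies from which no player can profitably deviate to another legal strategy.

```python
from itertools import product

def legal_nash_equilibria(controlled, goals, legal):
    """Tiny enumeration sketch for a boolean game with norms.

    controlled[i] : list of propositional variables controlled by player i
    goals[i]      : function taking the full valuation (dict var -> bool)
                    and returning True iff player i's goal is satisfied
    legal[i]      : list of legal strategies of player i, each a tuple of
                    booleans assigning values to controlled[i] in order

    Returns the profiles of legal strategies in which no player can satisfy
    a currently unsatisfied goal by switching to another legal strategy."""
    n = len(controlled)

    def valuation(profile):
        val = {}
        for i in range(n):
            val.update(zip(controlled[i], profile[i]))
        return val

    def payoff(profile, i):
        return 1 if goals[i](valuation(profile)) else 0

    equilibria = []
    for profile in product(*legal):
        stable = True
        for i in range(n):
            for alt in legal[i]:
                deviated = profile[:i] + (alt,) + profile[i + 1:]
                if payoff(deviated, i) > payoff(profile, i):
                    stable = False
                    break
            if not stable:
                break
        if stable:
            equilibria.append(profile)
    return equilibria
```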